Faster SVM training via conjugate SMO

Authors

Abstract

We propose an improved version of the SMO algorithm for training classification and regression SVMs, based on a conjugate descent procedure. The new approach involves only a modest increase in computational cost per iteration but, in turn, usually results in a substantial decrease in the number of iterations required to converge to a given precision. Besides, we prove convergence of the iterates of this method, as well as a linear rate of convergence when the kernel matrix is positive definite. We have implemented the method within the LIBSVM library and show experimentally that it is faster for many hyper-parameter configurations, often being a better option than second-order SMO when performing a grid search for SVM tuning.
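To make the baseline concrete, the following is a minimal sketch of classical two-variable SMO for a linear-kernel classifier, the procedure the conjugate variant builds on. This is illustrative teaching code, not the paper's implementation: the function name `simplified_smo`, the random choice of the second multiplier, and all constants are assumptions.

```python
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-5, max_sweeps=500, seed=0):
    """Minimal SMO sketch (linear kernel): repeatedly pick a KKT-violating
    multiplier alpha_i, pair it with a random alpha_j, and solve the
    two-variable dual subproblem in closed form."""
    rng = np.random.default_rng(seed)
    n = len(y)
    K = X @ X.T                                # precomputed kernel matrix
    alpha = np.zeros(n)
    b = 0.0
    f = lambda k: (alpha * y) @ K[:, k] + b    # decision value on point k

    quiet = 0
    for _ in range(max_sweeps):
        changed = 0
        for i in range(n):
            E_i = f(i) - y[i]
            # Only touch alpha_i if it violates the KKT conditions
            if not ((y[i] * E_i < -tol and alpha[i] < C) or
                    (y[i] * E_i > tol and alpha[i] > 0)):
                continue
            j = int(rng.integers(n - 1))
            j += j >= i                        # random j != i
            E_j = f(j) - y[j]
            a_i, a_j = alpha[i], alpha[j]
            # Box for alpha_j implied by the equality constraint
            if y[i] != y[j]:
                L, H = max(0.0, a_j - a_i), min(C, C + a_j - a_i)
            else:
                L, H = max(0.0, a_i + a_j - C), min(C, a_i + a_j)
            eta = 2 * K[i, j] - K[i, i] - K[j, j]   # curvature along the pair
            if L == H or eta >= 0:
                continue
            alpha[j] = np.clip(a_j - y[j] * (E_i - E_j) / eta, L, H)
            if abs(alpha[j] - a_j) < 1e-7:
                continue
            alpha[i] = a_i + y[i] * y[j] * (a_j - alpha[j])
            # Refresh the threshold from a multiplier strictly inside the box
            b1 = b - E_i - y[i]*(alpha[i]-a_i)*K[i, i] - y[j]*(alpha[j]-a_j)*K[i, j]
            b2 = b - E_j - y[i]*(alpha[i]-a_i)*K[i, j] - y[j]*(alpha[j]-a_j)*K[j, j]
            b = b1 if 0 < alpha[i] < C else b2 if 0 < alpha[j] < C else (b1 + b2) / 2
            changed += 1
        quiet = quiet + 1 if changed == 0 else 0
        if quiet >= 5:                         # several clean sweeps: stop
            break
    w = (alpha * y) @ X                        # primal weights (linear kernel)
    return w, b
```

The conjugate variant described in the abstract keeps this same working-set structure but corrects each update direction using previous ones, which is where the reduced iteration count comes from.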

Similar articles

Network Intrusion detection by using PCA via SMO-SVM

As network attacks have increased in number and severity over the past few years, an intrusion detection system (IDS) is increasingly becoming a critical component in securing the network. Due to large volumes of security audit data, as well as the complex and dynamic properties of intrusion behaviors, optimizing the performance of an IDS becomes an important open problem that is receiving more and more attenti...

On the Equivalence of the SMO and MDM Algorithms for SVM Training

SVM training is usually discussed under two different algorithmic points of view. The first one is provided by decomposition methods such as SMO and SVMLight, while the second one encompasses geometric methods that try to solve a Nearest Point Problem (NPP), the Gilbert–Schlesinger–Kozinec (GSK) and Mitchell–Demyanov–Malozemov (MDM) algorithms being the most representative ones. In this work we wi...
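The geometric view mentioned above reduces SVM training to finding the point of a convex hull closest to the origin. A minimal sketch of a Gilbert-style iteration, the basic step underlying the GSK family, is shown below; the function name and iteration budget are illustrative assumptions, not the cited paper's code.

```python
import numpy as np

def gilbert_npp(X, iters=200):
    """Gilbert-style iteration for the Nearest Point Problem: approximate
    the point of conv(rows of X) closest to the origin.

    Each step picks the vertex most aligned against the current iterate
    (a linear minimization over the hull) and then does an exact line
    search on the segment joining them."""
    w = X[0].copy()
    for _ in range(iters):
        i = int(np.argmin(X @ w))          # vertex minimizing <x_i, w>
        d = X[i] - w
        denom = d @ d
        if denom < 1e-12:                  # iterate already at the chosen vertex
            break
        # Exact line search: minimize ||w + t*d||^2 over t in [0, 1]
        t = float(np.clip(-(w @ d) / denom, 0.0, 1.0))
        w = w + t * d
    return w
```

In the SVM setting the hull is built from (suitably relabeled) training points, and the minimum-norm point yields the maximum-margin separator, which is why NPP solvers and decomposition methods such as SMO end up so closely related.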

Improvements to Platt's SMO Algorithm for SVM Classifier Design

This article points out an important source of inefficiency in Platt’s sequential minimal optimization (SMO) algorithm that is caused by the use of a single threshold value. Using clues from the KKT conditions for the dual problem, two threshold parameters are employed to derive modifications of SMO. These modified algorithms perform significantly faster than the original SMO on all benchmark d...
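The two-threshold idea above can be sketched as follows: instead of a single bias, maintain a threshold from the indices whose multiplier may still move "up" and another from those that may move "down", and declare optimality when the two nearly agree. This is a hedged illustration of that KKT bookkeeping; the names `two_thresholds`, `F`, and `tau` are assumptions, not the article's notation.

```python
import numpy as np

def two_thresholds(y, alpha, F, C, tau=1e-3):
    """Two-threshold optimality check for the SVM dual.

    F[i] stands for f(x_i) - y_i. I_up collects indices whose multiplier
    can increase its contribution y_i * alpha_i; I_low those that can
    decrease it. The iterate is (approximately) optimal when
    b_low <= b_up + 2 * tau."""
    I_up = ((alpha < C) & (y > 0)) | ((alpha > 0) & (y < 0))
    I_low = ((alpha < C) & (y < 0)) | ((alpha > 0) & (y > 0))
    b_up = F[I_up].min()
    b_low = F[I_low].max()
    return b_up, b_low, bool(b_low <= b_up + 2 * tau)
```

A single-threshold check can report convergence too early or too late when the bias estimate is off, whereas the gap `b_low - b_up` directly measures the worst KKT violation, which is what makes the modified stopping rule both safer and faster in practice.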

Improvements to the SMO algorithm for SVM regression

This paper points out an important source of inefficiency in Smola and Schölkopf's sequential minimal optimization (SMO) algorithm for support vector machine (SVM) regression that is caused by the use of a single threshold value. Using clues from the KKT conditions for the dual problem, two threshold parameters are employed to derive modifications of SMO for regression. These modified algorithm...

Second-Order SMO Improves SVM Online and Active Learning

Iterative learning algorithms that approximate the solution of support vector machines (SVMs) have two potential advantages. First, they allow online and active learning. Second, for large data sets, computing the exact SVM solution may be too time-consuming, and an efficient approximation can be preferable. The powerful LASVM iteratively approaches the exact SVM solution using sequential minim...


Journal

Journal title: Pattern Recognition

Year: 2021

ISSN: 1873-5142, 0031-3203

DOI: https://doi.org/10.1016/j.patcog.2020.107644